Digital Services Act (Regulation (EU) 2022/2065)




Index:

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms

    SECTION 3
    Additional provisions applicable to providers of online platforms

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks

    SECTION 6
    Other provisions concerning due diligence obligations

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

    SECTION 5
    Common provisions on enforcement

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS


Article 15

Transparency reporting obligations for providers of intermediary services

1.   Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a)

for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b)

for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c)

for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;

(d)

for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e)

any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.

2.   Paragraph 1 of this Article shall not apply to providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online platforms within the meaning of Article 33 of this Regulation.

3.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article, including harmonised reporting periods. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.
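The reporting duties in paragraph 1 amount to a small, aggregable data schema: counts broken down by category and Member State, plus median handling times, published in a machine-readable format. A minimal illustrative sketch in Python of point (a) — the field names and JSON layout here are hypothetical, since the Regulation prescribes none and the Commission may yet impose templates under paragraph 3:

```python
import json
from statistics import median

# Hypothetical records of Article 9/10 orders received by a provider.
# All field names are illustrative, not prescribed by the Regulation.
orders = [
    {"article": 9, "illegal_content_type": "hate speech", "member_state": "DE",
     "hours_to_acknowledge": 2.0, "hours_to_give_effect": 20.0},
    {"article": 9, "illegal_content_type": "hate speech", "member_state": "FR",
     "hours_to_acknowledge": 6.0, "hours_to_give_effect": 30.0},
    {"article": 10, "illegal_content_type": "counterfeit goods", "member_state": "DE",
     "hours_to_acknowledge": 4.0, "hours_to_give_effect": 48.0},
]

def article_15_1a_report(orders):
    """Aggregate orders as point (a) requires: counts per type of illegal
    content and per issuing Member State, plus median times to acknowledge
    receipt and to give effect to the order."""
    by_type, by_state = {}, {}
    for o in orders:
        by_type[o["illegal_content_type"]] = by_type.get(o["illegal_content_type"], 0) + 1
        by_state[o["member_state"]] = by_state.get(o["member_state"], 0) + 1
    return {
        "orders_total": len(orders),
        "by_illegal_content_type": by_type,
        "by_member_state": by_state,
        "median_hours_to_acknowledge": median(o["hours_to_acknowledge"] for o in orders),
        "median_hours_to_give_effect": median(o["hours_to_give_effect"] for o in orders),
    }

# JSON output satisfies "machine-readable" in the loosest sense; a template
# adopted under paragraph 3 would fix the actual form and content.
print(json.dumps(article_15_1a_report(orders), indent=2))
```

The same pattern (count by category, take the median of per-item durations) covers the notice statistics in point (b) and the complaint statistics in point (d).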

SECTION 2

Additional provisions applicable to providers of hosting services, including online platforms

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d)

testing and adapting their algorithmic systems, including their recommender systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a)

electronic submission of notices under Article 16;

(b)

templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;

(c)

electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d)

specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e)

auditing of very large online platforms and of very large online search engines pursuant to Article 37;

(f)

interoperability of the advertisement repositories referred to in Article 39(2);

(g)

transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h)

technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial communications referred to in Article 26;

(i)

choice interfaces and presentation of information on the main parameters of different types of recommender systems, in accordance with Articles 27 and 38;

(j)

standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.
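Points (a) and (c) of paragraph 1 envisage standardised electronic submission of notices, including through application programming interfaces. A hypothetical sketch of what a machine-checkable notice payload could look like — the field names and validation rules below are illustrative assumptions, not drawn from any adopted standard (they loosely mirror the elements a notice must contain under Article 16):

```python
# Hypothetical required elements of a notice payload; the Regulation fixes
# no field names and no standard has been adopted here -- this only
# illustrates the idea of a standardised, machine-checkable format.
REQUIRED_FIELDS = {
    "explanation",         # substantiated explanation of the alleged illegality
    "content_locations",   # exact electronic location(s), e.g. URLs
    "submitter_name",
    "submitter_email",
    "good_faith_statement",
}

def validate_notice(payload: dict) -> list:
    """Return the sorted list of required fields missing from a notice payload."""
    return sorted(REQUIRED_FIELDS - payload.keys())

notice = {
    "explanation": "Listing offers counterfeit branded goods.",
    "content_locations": ["https://example.invalid/item/123"],
    "submitter_name": "Jane Doe",
    "submitter_email": "jane@example.invalid",
    "good_faith_statement": True,
}
print(validate_notice(notice))  # an empty list means the payload is complete
```

A shared schema of this kind is what would let trusted flaggers under point (c) submit notices programmatically to many providers without per-provider integration work.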

Article 55

Activity reports

1.   Digital Services Coordinators shall draw up annual reports on their activities under this Regulation, including the number of complaints received pursuant to Article 53 and an overview of their follow-up. The Digital Services Coordinators shall make the annual reports available to the public in a machine-readable format, subject to the applicable rules on the confidentiality of information pursuant to Article 84, and shall communicate them to the Commission and to the Board.

2.   The annual report shall also include the following information:

(a)

the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 9 and 10 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;

(b)

the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 9 and 10.

3.   Where a Member State has designated several competent authorities pursuant to Article 49, it shall ensure that the Digital Services Coordinator draws up a single report covering the activities of all competent authorities and that the Digital Services Coordinator receives all relevant information and support needed to that effect from the other competent authorities concerned.

SECTION 2

Competences, coordinated investigation and consistency mechanisms

